Approximating Joint Probability Distributions Given Partial Information

Authors

  • Luis V. Montiel
  • J. Eric Bickel
Abstract

In this paper, we propose new methods to approximate probability distributions that are incompletely specified. We compare these methods to the use of maximum entropy and quantify the accuracy of all methods within the context of an illustrative example. We show that, in this example, the methods we propose are more accurate than existing methods.
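
As a point of comparison for the maximum-entropy benchmark named in the abstract, here is a minimal sketch of that standard construction for a small, incompletely specified joint distribution: a 2 x 2 joint pmf for which only P(X = 0) = 0.7 and P(Y = 0) = 0.6 are assumed known. The marginal values, problem size, and SciPy solver are illustrative assumptions; this is not the authors' proposed methods.

    # Hedged sketch: maximum-entropy 2x2 joint pmf given only its marginals.
    # This is the generic benchmark the abstract compares against, not the
    # paper's proposed methods; all numbers are illustrative assumptions.
    import numpy as np
    from scipy.optimize import minimize

    def neg_entropy(q):
        q = np.clip(q, 1e-12, 1.0)        # guard against log(0)
        return np.sum(q * np.log(q))      # minimizing this maximizes entropy

    constraints = [
        {"type": "eq", "fun": lambda q: q.sum() - 1.0},                      # pmf sums to 1
        {"type": "eq", "fun": lambda q: q.reshape(2, 2)[0].sum() - 0.7},     # assumed P(X = 0)
        {"type": "eq", "fun": lambda q: q.reshape(2, 2)[:, 0].sum() - 0.6},  # assumed P(Y = 0)
    ]

    res = minimize(neg_entropy, x0=np.full(4, 0.25),
                   bounds=[(0.0, 1.0)] * 4, constraints=constraints)
    print(res.x.reshape(2, 2))  # with only marginals given, this recovers their product

With only marginal information the maximum-entropy joint is simply the product of the marginals; the comparison in the paper becomes interesting for richer information sets (for example, pairwise correlations), where the answer is less obvious.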

Similar Resources

Direct Divergence Approximation between Probability Distributions and Its Applications in Machine Learning

Approximating a divergence between two probability distributions from their samples is a fundamental challenge in statistics, information theory, and machine learning. A divergence approximator can be used for various purposes such as two-sample homogeneity testing, change-point detection, and class-balance estimation. Furthermore, an approximator of a divergence between the joint distribution ...
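
For concreteness, the following sketch estimates a KL divergence from two sample sets with a naive shared-bin histogram plug-in. The "direct" approximators surveyed in this line of work deliberately avoid such an explicit density-estimation step, so treat this only as a baseline illustration; the sample sizes and Gaussian inputs are made-up assumptions.

    # Naive plug-in estimate of KL(p || q) from samples via shared histogram bins.
    # Only a baseline illustration; direct divergence approximators skip the
    # explicit density-estimation step performed here.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, 5000)   # assumed samples from p
    y = rng.normal(0.5, 1.2, 5000)   # assumed samples from q

    edges = np.histogram_bin_edges(np.concatenate([x, y]), bins=50)
    p, _ = np.histogram(x, bins=edges, density=True)
    q, _ = np.histogram(y, bins=edges, density=True)
    width = np.diff(edges)

    mask = (p > 0) & (q > 0)          # skip empty bins to avoid log(0)
    kl = np.sum(width[mask] * p[mask] * np.log(p[mask] / q[mask]))
    print(f"plug-in KL(p || q) estimate: {kl:.3f}")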

A Hyperexponential Approximation to Finite-Time and Infinite-Time Ruin Probabilities of Compound Poisson Processes

This article considers the problem of evaluating the infinite-time (or finite-time) ruin probability of a given compound Poisson surplus process by approximating the claim size distribution with a finite mixture of exponentials, i.e., a hyperexponential distribution. It restates the infinite-time (or finite-time) ruin probability as a solvable ordinary differential equation (or a partial differential equ...
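
As a minimal illustration of the setting, the sketch below estimates a finite-time ruin probability by straightforward Monte Carlo for a compound Poisson surplus process with hyperexponential (mixture-of-exponentials) claims. The article itself works through differential equations rather than simulation, and every parameter value here is an arbitrary assumption.

    # Monte Carlo sketch of finite-time ruin for a compound Poisson surplus
    # process with hyperexponential claims; illustrative only, not the
    # article's differential-equation method. All parameters are assumed.
    import numpy as np

    rng = np.random.default_rng(1)
    u, c, lam, horizon = 10.0, 2.5, 1.0, 20.0                      # initial surplus, premium rate, claim rate, time horizon
    weights, rates = np.array([0.6, 0.4]), np.array([1.0, 0.25])   # hyperexponential mixture

    def ruined_once():
        t, claims = 0.0, 0.0
        while True:
            t += rng.exponential(1.0 / lam)           # next claim arrival
            if t > horizon:
                return False                          # survived the horizon
            k = rng.choice(len(weights), p=weights)   # pick a mixture component
            claims += rng.exponential(1.0 / rates[k])
            if u + c * t - claims < 0:                # surplus can only drop at claim times
                return True

    n = 5000
    print("finite-time ruin probability ~", sum(ruined_once() for _ in range(n)) / n)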

Generating a Random Collection of Discrete Joint Probability Distributions Subject to Partial Information

In this paper, we develop a practical and flexible methodology for generating a random collection of discrete joint probability distributions, subject to a specified information set, which can be expressed as a set of linear constraints (e.g., marginal assessments, moments, or pairwise correlations). Our approach begins with the construction of a polytope using this set of linear constraints. T...
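
To make the sampled object concrete, the rejection sketch below draws 2 x 2 joint pmfs uniformly from the probability simplex and keeps those whose marginals fall inside an assumed tolerance band around fixed values. This is a deliberately naive stand-in: the paper's polytope construction and sampler are far more efficient and handle general linear constraints, and the constraint values here are purely illustrative.

    # Naive rejection sketch: sample joint pmfs satisfying approximate marginal
    # constraints. A stand-in for intuition only; the paper's polytope-based
    # sampler is the practical tool. Constraint values are assumptions.
    import numpy as np

    rng = np.random.default_rng(2)
    kept = []
    while len(kept) < 100:
        q = rng.dirichlet(np.ones(4)).reshape(2, 2)      # uniform draw from the simplex
        marg_x, marg_y = q.sum(axis=1), q.sum(axis=0)
        if (np.abs(marg_x - [0.7, 0.3]) < 0.05).all() and (np.abs(marg_y - [0.6, 0.4]) < 0.05).all():
            kept.append(q)                               # joint pmf meets the information set

    print(len(kept), "distributions kept; mean P(X=0, Y=0) =",
          round(float(np.mean([q[0, 0] for q in kept])), 3))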

Discovering a junction tree behind a Markov network by a greedy algorithm

In our paper [18] we introduced a special kind of k-width junction tree, called the k-th order t-cherry junction tree, in order to approximate a joint probability distribution. The approximation is best when the Kullback-Leibler divergence between the true joint probability distribution and the approximating one is minimal. Finding the best approximating k-width junction tree is NP-comple...
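
The k = 2 case of this family corresponds to the classic Chow-Liu tree: the spanning tree that maximizes total pairwise mutual information and thereby minimizes the Kullback-Leibler divergence among tree-structured approximations. A small sketch of that special case on synthetic binary data follows; the data-generating choices are assumptions, and the greedy step here is plain Kruskal, not the paper's t-cherry algorithm.

    # Chow-Liu sketch (the k = 2 special case): maximum-weight spanning tree
    # under pairwise mutual information, built greedily Kruskal-style.
    # Synthetic binary data; not the paper's k-th order t-cherry algorithm.
    import itertools
    import numpy as np

    rng = np.random.default_rng(3)
    data = rng.integers(0, 2, size=(1000, 4))              # 4 binary variables
    data[:, 1] = data[:, 0] ^ (rng.random(1000) < 0.1)     # X1 mostly copies X0
    data[:, 3] = data[:, 2] ^ (rng.random(1000) < 0.2)     # X3 mostly copies X2

    def mutual_info(a, b):
        joint = np.histogram2d(a, b, bins=2)[0] / len(a)
        pa, pb = joint.sum(axis=1), joint.sum(axis=0)
        nz = joint > 0
        return np.sum(joint[nz] * np.log(joint[nz] / np.outer(pa, pb)[nz]))

    edges = sorted(((mutual_info(data[:, i], data[:, j]), i, j)
                    for i, j in itertools.combinations(range(4), 2)), reverse=True)
    parent = list(range(4))
    def find(v):
        while parent[v] != v:
            v = parent[v]
        return v
    tree = []
    for w, i, j in edges:               # add the heaviest edge that avoids a cycle
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.append((i, j, round(w, 3)))
    print("Chow-Liu tree edges:", tree)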

On the complexity of computational problems regarding distributions (a survey)

We consider two basic computational problems regarding discrete probability distributions: (1) approximating the statistical difference (aka variation distance) between two given distributions, and (2) approximating the entropy of a given distribution. Both problems are considered in two different settings. In the first setting the approximation algorithm is only given samples from the distribu...
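
The two quantities in question have simple empirical plug-in estimates when samples from small discrete distributions are available, as sketched below; the survey's subject is the computational complexity of these approximation problems under various access models, which the snippet does not touch. The alphabet size and distributions are assumptions.

    # Plug-in estimates of statistical (total variation) distance and entropy
    # from samples of discrete distributions; illustrative assumptions only.
    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.choice(6, size=5000, p=[0.3, 0.25, 0.2, 0.1, 0.1, 0.05])  # samples from p
    y = rng.choice(6, size=5000)                                      # samples from q (uniform)

    p_hat = np.bincount(x, minlength=6) / len(x)
    q_hat = np.bincount(y, minlength=6) / len(y)

    tv = 0.5 * np.abs(p_hat - q_hat).sum()                            # statistical difference
    entropy = -np.sum(p_hat[p_hat > 0] * np.log2(p_hat[p_hat > 0]))   # entropy in bits
    print(f"TV(p, q) ~ {tv:.3f}, H(p) ~ {entropy:.3f} bits")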

Journal:
  • Decision Analysis

Volume 10, Issue

Pages -

Publication year: 2013